A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure
Authors
Abstract
In many statistical learning problems, it is desired that the optimal solution conform to an a priori known sparsity structure represented by a directed acyclic graph. Inducing such structures by means of convex regularizers requires nonsmooth penalty functions that exploit group overlapping. Our study focuses on evaluating the proximal operator of the latent overlapping group lasso developed by Jacob et al. in 2009. We implemented an alternating direction method of multipliers (ADMM) with a sharing scheme to solve large-scale instances of the underlying optimization problem efficiently. In the absence of strong convexity, global linear convergence of the algorithm is established using error bound theory. More specifically, the paper contributes to establishing primal and dual error bounds when the nonsmooth component of the objective function does not have a polyhedral epigraph. We also investigate the effect of the graph structure on the speed of convergence of the algorithm. Detailed numerical simulation studies over different graph structures supporting the proposed algorithm and two applications are provided.

Summary of Contribution: The paper proposes a computationally efficient method to evaluate a hierarchical sparsity-inducing regularizer and establishes its convergence properties. The computationally intensive subproblem can be fully parallelized, which allows solving large-scale instances of the underlying problem. Comprehensive benchmarking against five other methods on convergence to optimality is provided. Furthermore, the performance of the algorithm is demonstrated on two applications related to topic modeling and breast cancer classification. The code along with the benchmarks is available on the corresponding author’s GitHub website for evaluation and future use.
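The computational core described above is the evaluation of the proximal operator of the latent overlapping group lasso via ADMM in sharing form, where the group-wise subproblems decouple. The sketch below is a minimal illustration of that idea, not the authors' implementation: the function name prox_latent_group_lasso, the arguments groups, lam, weights, and rho, the fixed penalty parameter, and the stopping rule are all assumptions introduced here, and the updates follow the generic sharing-form ADMM (a group soft-threshold for each latent component, a closed-form averaged update for the quadratic term, and a scaled dual update).

```python
import numpy as np

def prox_latent_group_lasso(y, groups, lam, weights=None, rho=1.0,
                            max_iter=500, tol=1e-6):
    """Minimal sketch (not the paper's code): approximately solve
        argmin_beta 0.5*||beta - y||^2 + lam * Omega(beta),
    where Omega is the latent overlapping group lasso, i.e.
        Omega(beta) = min { sum_g w_g * ||v_g||_2 :
                            sum_g v_g = beta, supp(v_g) in groups[g] },
    using ADMM in sharing form so the v_g updates decouple across groups.
    """
    y = np.asarray(y, dtype=float)
    p, G = y.size, len(groups)
    w = np.ones(G) if weights is None else np.asarray(weights, dtype=float)

    V = np.zeros((G, p))   # latent components v_g, one row per group
    z_bar = np.zeros(p)    # average of the sharing copies
    u = np.zeros(p)        # scaled dual variable (shared across groups)

    for _ in range(max_iter):
        x_bar_old = V.mean(axis=0)
        # v_g updates: independent group soft-thresholds (parallelizable).
        for g, idx in enumerate(groups):
            a = V[g] - x_bar_old + z_bar - u   # local proximal target
            V[g] = 0.0                         # v_g is supported on groups[g] only
            a_g = a[idx]
            nrm = np.linalg.norm(a_g)
            thr = lam * w[g] / rho
            if nrm > thr:
                V[g, idx] = (1.0 - thr / nrm) * a_g
        x_bar = V.mean(axis=0)
        # Averaged update for the quadratic term 0.5*||G*z_bar - y||^2 (closed form).
        z_bar = (y + rho * (x_bar + u)) / (G + rho)
        # Scaled dual update on the averaged consensus constraint.
        u = u + x_bar - z_bar
        if np.linalg.norm(x_bar - z_bar) <= tol * max(1.0, np.linalg.norm(z_bar)):
            break

    beta = V.sum(axis=0)   # prox value: sum of the latent components
    return beta, V
```

Because each v_g update touches only the coordinates of its own group, the inner loop can be distributed across workers; the only coupling between groups is through the averages, which is the sharing structure that makes the subproblem fully parallelizable, as highlighted in the abstract.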
Similar resources
A Hybrid Optimization Algorithm for Learning Deep Models
Deep learning is one of the subsets of machine learning that is widely used in Artificial Intelligence (AI) field such as natural language processing and machine vision. The learning algorithms require optimization in multiple aspects. Generally, model-based inferences need to solve an optimized problem. In deep learning, the most important problem that can be solved by optimization is neural n...
Learning First Order Logic Rules with a Genetic Algorithm
This paper introduces a new algorithm called SIAO1 for learning first order logic rules with genetic algorithms. SIAO1 uses the covering principle developed in AQ where seed examples are generalized into rules using however a genetic search, as initially introduced in the SIA algorithm for attribute-based representation. The genetic algorithm uses a high level representation for learning rules ...
Learning with Sparsity: Structures, Optimization and Applications
The development of modern information technology has enabled collecting data of unprecedented size and complexity. Examples include web text data, microarray & proteomics, and data from scientific domains (e.g., meteorology). To learn from these high dimensional and complex data, traditional machine learning techniques often suffer from the curse of dimensionality and unaffordable computational...
A Discrete Hybrid Teaching-Learning-Based Optimization algorithm for optimization of space trusses
In this study, to enhance the optimization process, especially in the structural engineering field two well-known algorithms are merged together in order to achieve an improved hybrid algorithm. These two algorithms are Teaching-Learning Based Optimization (TLBO) and Harmony Search (HS) which have been used by most researchers in varied fields of science. The hybridized algorithm is called A Di...
Journal
Journal title: INFORMS Journal on Computing
Year: 2021
ISSN: ['1091-9856', '1526-5528']
DOI: https://doi.org/10.1287/ijoc.2021.1069